Welcome to BASS!

Version: Beta 2.0

Created by Abigail Dobyns and Ryan Thorpe

BASS: Biomedical Analysis Software Suite for event detection and signal processing.
Copyright (C) 2015  Abigail Dobyns

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with this program.  If not, see <http://www.gnu.org/licenses/>

Initialize

Run the following code block to initialize the program.

Run this block one time.


In [1]:
from BASS import *



Instructions

For help, check out the wiki: Protocol

Or the video tutorial: Coming Soon!

1) Load Data File(s)

Use the following block to create a BASS_Dataset object and initialize your settings. All settings are attributes of the dataset instance. Manual initialization of settings in this block is optional and needs to be done only once for a given batch. Every BASS_Dataset object that is initialized is automatically added to the batch.

class BASS_Dataset(inputDir, fileName, outputDir, fileType='plain', timeScale='seconds')

    Attributes:

        Batch: static list
            Contains all instances of the BASS_Dataset object in order to be referenced by the global runBatch function.
        Data: dictionary
            instance data
        Settings: dictionary
            instance settings
        Results: dictionary
            instance results

    Methods:

        run_analysis(analysis_mod, settings=self.Settings, batch=True): BASS_Dataset method
            Highest level of the object-oriented analysis pipeline. First syncs the settings of all BASS_Dataset objects 
            (stored in Batch), then runs the specified analysis module on each one.

If Settings are to be added manually (rather than via the interactive check-and-load settings function), run_analysis must be called after the object has been initialized and the Settings have been added. Analysis runs according to the batch-oriented protocol and is specific to the analysis module selected by the "analysis_mod" parameter.

2) Run Analysis

Run BASS_Dataset.run_analysis(analysis_mod, settings, batch)

Runs in either single (batch=False) or batch mode. In batch mode, this function first syncs the settings of each dataset within BASS_Dataset.Batch to the entered "settings" parameter, then runs the analysis on each instance within Batch. Be sure to select the correct module for your desired type of analysis; the current options (as of 9/21/16) are "ekg" and "pleth". The parameters are as follows, and a minimal usage sketch follows the parameter list:

    Parameters:
        analysis_mod: string
            the name of the BASS_Dataset module which will be used to analyze the batch of datasets
        settings: string or dictionary
            can be entered as the location of a settings file or the actual settings dictionary (default = self.Settings)
        batch: boolean
            determines if the analysis is performed on only the self-instance or as a batch on all object instances 
            (default=True)
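
Below is a minimal, hypothetical sketch of the batch workflow described above; the directory paths and file names are placeholders, not files shipped with BASS.

#Initialize two datasets; both are added to BASS_Dataset.Batch automatically
data1 = BASS_Dataset('/path/to/data/', 'file1.txt', '/path/to/output/')
data2 = BASS_Dataset('/path/to/data/', 'file2.txt', '/path/to/output/')

#Sync every dataset in the batch to data1's settings, then run the 'pleth' module on each
data1.run_analysis('pleth', settings=data1.Settings, batch=True)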


More Info on Settings

For more information about other settings, go to:

Transforming Data

Baseline Settings

Peak Detection Settings

Burst Detection Settings


In [2]:
#Import and Initialize
#Class BASS_Dataset(inputDir, fileName, outputDir, fileType='plain', timeScale='seconds')
data1 = BASS_Dataset('C:\\Users\\Ryan\\Desktop\\sample_data\\','2016-06-22_C1a_P3_Base.txt','C:\\Users\\Ryan\\Desktop\\bass_output\\pleth')
data2 = BASS_Dataset('C:\\Users\\Ryan\\Desktop\\sample_data\\','2016-06-26_C1a_P7_Base.txt','C:\\Users\\Ryan\\Desktop\\bass_output\\pleth')
    
#Transformation Settings
data1.Settings['Absolute Value'] = False #Must be True if Savitzky-Golay is being used

data1.Settings['Bandpass Highcut'] = 12 #in Hz
data1.Settings['Bandpass Lowcut'] = 1 #in Hz
data1.Settings['Bandpass Polynomial'] = 1 #integer

data1.Settings['Linear Fit'] = False #between 0 and 1 on the whole time series
data1.Settings['Linear Fit-Rolling R'] = 0.5 #between 0 and 1
data1.Settings['Linear Fit-Rolling Window'] = 1000 #window for rolling mean for fit, unit is index not time
data1.Settings['Relative Baseline'] = 0 #default 0, unless data is normalized, then 1.0. Can be any float

data1.Settings['Savitzky-Golay Polynomial'] = 'none' #integer
data1.Settings['Savitzky-Golay Window Size'] = 'none' #must be odd. units are index not time

#Baseline Settings
data1.Settings['Baseline Type'] = r'rolling' #'linear', 'rolling', or 'static'
#For Linear
data1.Settings['Baseline Start'] = None #start time in seconds
data1.Settings['Baseline Stop'] = None #end time in seconds
#For Rolling
data1.Settings['Rolling Baseline Window'] = 5 # in seconds. leave as 'none' if linear or static

#Peaks
data1.Settings['Delta'] = 0.05
data1.Settings['Peak Minimum'] = -0.50 #amplitude value
data1.Settings['Peak Maximum'] = 0.50 #amplitude value

#Bursts
data1.Settings['Apnea Factor'] = 2 #factor to define apneas as a function of expiration
data1.Settings['Burst Area'] = True #calculate burst area
data1.Settings['Exclude Edges'] = True #False to keep edges, True to discard them
data1.Settings['Inter-event interval minimum (time-scale units)'] = 0.0001 #only for bursts, not for peaks
data1.Settings['Maximum Burst Duration (time-scale units)'] = 6
data1.Settings['Minimum Burst Duration (time-scale units)'] = 0.0001
data1.Settings['Minimum Peak Number'] = 1 #minimum number of peaks/burst, integer
data1.Settings['Threshold']= 0.0001 #linear: proportion of baseline.
                              #static: literal value.
                              #rolling: amount greater than the rolling baseline at each time point.

#Outputs
data1.Settings['Generate Graphs'] = False #create and save the fancy graph outputs

#Settings that you should not change unless you are a super advanced user:
#These are settings that are still in development
data1.Settings['Graph LCpro events'] = False
############################################################################################
data1.run_analysis('pleth', batch=False)


############    2016-06-22_C1a_P3_Base.txt    ############

Made plots folder
Data Loaded
Rounded Sampling Rate (s/frame): 0.001
2016-06-22_C1a_P3_Base.txt is 301.899 seconds long.

############    2016-06-26_C1a_P7_Base.txt    ############

Made plots folder
Data Loaded
Rounded Sampling Rate (s/frame): 0.001
2016-06-26_C1a_P7_Base.txt is 316.549 seconds long.
All primary-analysis settings have been initialized
Transformation completed
Baseline set completed
Peak Detection completed
Burst Detection completed
Analysis Complete:  12.6835  Seconds

--------------------------------------------
Data Column Names/Keys
-----
2016-06-22_C1a_P3_Base.txt

--------------------------------------------
Available Measurements from Peaks for further analysis:
-----
Peaks Amplitude
Intervals

--------------------------------------------
Available Measurements from Bursts for further analysis:
-----
Burst Start
Burst End
Burst Duration
Burst Start Amplitude
Burst End Amplitude
Edge Event
Interburst Interval
Total Cycle Time
Peaks per Burst
Peak Amp
Peak Time
Attack
Decay
Intraburst Frequency
Burst Area

---------------------------
|Event Detection Complete!|
---------------------------
Do you wish to re-run peak and burst detection? (y/n) n
All Bursts measurements analyzed.
Total Cycle Time Count
     2016-06-22_C1a_P3_Base.txt
0                           0.0
30                          0.0
60                          0.0
90                          0.0
120                         0.0
150                         0.0
180                         0.0
210                         0.0
240                         0.0
270                         0.0
300                        42.0
330                        57.0
360                        49.0
390                        45.0
420                        51.0
450                        49.0
480                        42.0
510                        47.0
540                        46.0
Total Cycle Time Mean
     2016-06-22_C1a_P3_Base.txt
0                      0.000000
30                     0.000000
60                     0.000000
90                     0.000000
120                    0.000000
150                    0.000000
180                    0.000000
210                    0.000000
240                    0.000000
270                    0.000000
300                    0.171429
330                    0.201544
360                    0.169796
390                    0.212356
420                    0.205353
450                    0.175429
480                    0.177571
510                    0.203830
540                    0.200043
Total Cycle Time Std
     2016-06-22_C1a_P3_Base.txt
0                      0.000000
30                     0.000000
60                     0.000000
90                     0.000000
120                    0.000000
150                    0.000000
180                    0.000000
210                    0.000000
240                    0.000000
270                    0.000000
300                    0.112276
330                    0.129965
360                    0.093723
390                    0.127654
420                    0.121268
450                    0.079636
480                    0.100484
510                    0.111712
540                    0.115639
Pleth Analysis Complete: 11.2652 sec

OPTIONAL GRAPHS AND ANALYSIS

The following blocks are optional calls to additional figures and analyses.

Display Event Detection Tables

Display Settings used for analysis


In [243]:
display_settings(Settings)


Out[243]:
Value
Absolute Value False
Apnea Factor 2
Bandpass Highcut False
Bandpass Lowcut False
Bandpass Polynomial False
Baseline Start 0
Baseline Stop 1
Baseline Type rolling
Burst Area True
Delta 0.15
Exclude Edges True
File Type Plain
Generate Graphs False
Graph LCpro events False
Inter-event interval minimum (seconds) 0.01
Label 2015-08-06-rat7-pleth-24hrsPostIPSel3.txt
Linear Fit False
Linear Fit-Rolling R 0.75
Linear Fit-Rolling Window 1000
Maximum Burst Duration (s) 0.5
Milliseconds False
Minimum Burst Duration (s) 0.01
Minimum Peak Number 1
Output Folder C:\Users\smurray\Desktop\Pleth Data Lab Chart\...
Peak Maximum -0.4
Peak Minimum -1.5
Relative Baseline 0
Rolling Baseline Window 2.5
Sample Rate (s/frame) 0.00025
Savitzky-Golay Polynomial none
Savitzky-Golay Window Size none
Threshold 0.01
folder C:\Users\smurray\Desktop\Pleth Data Lab Chart\...
plots folder C:\Users\smurray\Desktop\Pleth Data Lab Chart\...

Display Summary Results for Peaks


In [3]:
#grouped summary for peaks
Results['Peaks-Master'].groupby(level=0).describe()


Out[3]:
                                              Peaks Amplitude    Intervals
2016-2-10-mouse4-metformin-group1.txt  count      2912.000000  2911.000000
                                       mean          0.676046     0.320196
                                       std           0.072286     0.208912
                                       min           0.511200     0.080000
                                       25%           0.632125     0.167000
                                       50%           0.656400     0.288000
                                       75%           0.704300     0.406500
                                       max           1.036000     1.866000

Display Summary Results for Bursts


In [58]:
#grouped summary for bursts
Results['Bursts-Master'].groupby(level=0).describe()


Out[58]:
Burst Start Burst End Burst Duration Burst Start Amplitude Burst End Amplitude Edge Event Interburst Interval Total Cycle Time Peaks per Burst Peak Amp Peak Time Attack Decay Intraburst Frequency Burst Area
plethneg.txt count 1557.000000 1557.000000 1557.000000 1557.000000 1557.000000 1557 1557.000000 1557.000000 1557 1557.000000 1557.000000 1557.000000 1557.000000 1557.000000 1557.000000
mean 712.696051 712.739542 0.043491 -15.792990 -15.734354 0 0.109395 0.152885 1 -13.657823 712.717958 0.021907 0.021584 24.102735 -0.627847
std 96.541692 96.542019 0.010693 2.263578 2.263499 0 0.091221 0.093783 0 2.397434 96.542205 0.007716 0.007325 4.975687 0.181741
min 550.158500 550.189500 0.024250 -21.812083 -21.804533 False 0.025250 0.058250 1 -20.300000 550.172750 0.004250 0.003750 5.524862 -2.342260
25% 625.328250 625.370500 0.036500 -17.524367 -17.481967 0 0.061500 0.101500 1 -15.410000 625.347500 0.016750 0.017750 20.618557 -0.721927
50% 718.691000 718.744500 0.042000 -15.846600 -15.811533 0 0.078750 0.124750 1 -13.700000 718.707750 0.020500 0.020750 23.809524 -0.595577
75% 793.907750 793.966250 0.048500 -14.092733 -14.012883 0 0.123750 0.169750 1 -12.110000 793.940250 0.025250 0.024000 27.397260 -0.500782
max 878.754500 878.802250 0.181000 -8.446017 -8.115333 False 1.072750 1.117500 1 -4.610000 878.784250 0.075250 0.143750 41.237113 -0.232644

Interactive Graphs

Line Graphs

One panel, detected events

Plot one time series by calling its name


In [3]:
#Interactive, single time series by Key
key = Settings['Label']
graph_ts(Data, Settings, Results, key)

Two panel

Create line plots of the raw data as well as the data analysis.

Plots are saved by clicking the save button in the pop-up window with your graph.

key = 'Mean1'
start = 100
end = 101

Results Line Plot


In [5]:
key = Settings['Label']
start = 550 #start time in seconds
end = 560 #end time in seconds
results_timeseries_plot(key, start, end, Data, Settings, Results)

Autocorrelation

Display the Autocorrelation plot of your transformed data.

Choose the start and end time in seconds. To capture the whole time series, use end = -1. This may be slow.

key = 'Mean1'
start = 0 
end = 10

Autocorrelation Plot


In [ ]:
#autocorrelation
key = Settings['Label']
start = 0 #seconds, where you want the slice to begin
end = 1 #seconds, where you want the slice to end.
autocorrelation_plot(Data['trans'][key][start:end])
plt.show()

Raster Plot

Shows the temporal relationship of peaks in each column. Auto scales. Display only. Intended for more than one column of data.


In [ ]:
#raster
raster(Data, Results)

Frequency Plot

Use this block to plot changes of any measurement over time. Does not support 'all'. Example:

event_type = 'Peaks'
meas = 'Intervals'
key = 'Mean1'

Frequency Plot


In [5]:
event_type = 'Peaks'
meas = 'Intervals'
key = Settings['Label']
frequency_plot(event_type, meas, key, Data, Settings, Results)

Analyze Events by Measurement

Generates a line plot with error bars for a given event measurement. The x-axis is the name of each time series. Display only. Intended for more than one column of data. This is not a box-and-whiskers plot.

event_type = 'Peaks'
meas = 'Peaks Amplitude'

Analyze Events by Measurement


In [ ]:
#Get average plots, display only
event_type = 'Peaks'
meas = 'Intervals'
average_measurement_plot(event_type, meas,Results)

Poincare Plots

Create a Poincare plot of your favorite variable. Choose an event type (Peaks or Bursts) and a measurement type. Calling meas = 'All' is supported.

Plots and tables are saved automatically

Example:

event_type = 'Bursts'
meas = 'Burst Duration'

More on Poincare Plots
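
For reference, SD1 and SD2 are commonly defined from the successive differences of the measurement. The sketch below uses that standard formulation and is only illustrative; BASS's poincare functions may differ in detail.

import numpy as np

def poincare_sd(x):
    #standard Poincare descriptors from successive differences
    x = np.asarray(x, dtype=float)
    d = np.diff(x)                                     #x[n+1] - x[n]
    sd1 = np.sqrt(np.var(d) / 2.0)                     #short-term spread (width of the cloud)
    sd2 = np.sqrt(2.0 * np.var(x) - np.var(d) / 2.0)   #long-term spread (length of the cloud)
    return sd1, sd2

#example call, using the same Results indexing as the Quick Poincare block below
#sd1, sd2 = poincare_sd(Results['Bursts'][Settings['Label']]['Total Cycle Time'].dropna())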

Batch Poincare

Batch Poincare


In [81]:
#Batch
event_type = 'Bursts'
meas = 'Total Cycle Time'
Results = poincare_batch(event_type, meas, Data, Settings, Results)
pd.concat({'SD1':Results['Poincare SD1'],'SD2':Results['Poincare SD2']})


Out[81]:
Attack Decay Total Cycle Time
SD1 pleth.txt 0.005717 0.005968 0.117689
SD2 pleth.txt 0.009136 0.006769 0.140370

Quick Poincare Plot

Quickly call one poincare plot for display. Plot and Table are not saved automatically. Choose an event type (Peaks or Bursts), measurement type, and key. Calling meas = 'All' is not supported.

Quick Poincare


In [6]:
#quick
event_type = 'Bursts'
meas = 'Attack'
key = Settings['Label']
poincare_plot(Results[event_type][key][meas])


Attack results:
SD1 = 0.0257 s
SD2 = 0.0311 s

Power Spectral Density

The following blocks allow you to assess the power of event measurements in the frequency domain. While you can call this block on any event measurement, it is intended to be used on interval data (or at least data with units of seconds). Recommended:

event_type = 'Bursts'
meas = 'Total Cycle Time'
key = 'Mean1'
scale = 'raw'

event_type = 'Peaks'
meas = 'Intervals'
key = 'Mean1'
scale = 'raw'

Because event measurements are unevenly spaced in time, they must be interpolated onto a uniform grid before an FFT can be performed. Does not support 'all'.
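
As a generic illustration of that interpolation step (this is not BASS's psd_event, just a sketch with hypothetical event times):

import numpy as np
from scipy import signal, interpolate

hz = 100.0                                         #uniform resampling frequency
event_times = np.array([0.0, 0.8, 1.7, 2.5, 3.6])  #hypothetical peak times in seconds
intervals = np.diff(event_times)                   #unevenly spaced interval series

#interpolate the intervals onto an even time base, then estimate the PSD
f = interpolate.interp1d(event_times[1:], intervals, kind='linear')
t_even = np.arange(event_times[1], event_times[-1], 1.0/hz)
freqs, power = signal.welch(f(t_even), fs=hz)      #power in s^2/Hz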

Power Spectral Density: Events

Events

Use the code block below to specify your settings for event measurment PSD.


In [18]:
Settings['PSD-Event'] = Series(index = ['hz','ULF', 'VLF', 'LF','HF','dx'])
#Set PSD ranges for power in band

Settings['PSD-Event']['hz'] = 100 #frequency that the interpolation and PSD are performed with.
Settings['PSD-Event']['ULF'] = 1 #max of the range of the ultra low freq band. range is 0:ulf
Settings['PSD-Event']['VLF'] = 2 #max of the range of the very low freq band. range is ulf:vlf
Settings['PSD-Event']['LF'] = 5 #max of the range of the low freq band. range is vlf:lf
Settings['PSD-Event']['HF'] = 50 #max of the range of the high freq band. range is lf:hf. hf can be no more than (hz/2)
Settings['PSD-Event']['dx'] = 10 #segmentation for the area under the curve.

In [19]:
event_type = 'Peaks'
meas = 'Intervals'
key = Settings['Label']
scale = 'raw'
Results = psd_event(event_type, meas, key, scale, Data, Settings, Results)
Results['PSD-Event'][key]


Out[19]:
Intervals
ULF 0.0328843
VLF 0.00711321
LF 0.000481597
HF 1.65282e-06
LF/HF 291.378
Scale s^2/Hz

Time Series

Use the settings code block below to set the frequency bands used to calculate the area under the curve. This block is not required. Band output is always in raw power, even if the graph scale is dB/Hz.

Power Spectral Density: Signal


In [ ]:
#optional
Settings['PSD-Signal'] = Series(index = ['ULF', 'VLF', 'LF','HF','dx'])

#Set PSD ranges for power in band
Settings['PSD-Signal']['ULF'] = 25 #max of the range of the ultra low freq band. range is 0:ulf
Settings['PSD-Signal']['VLF'] = 75 #max of the range of the very low freq band. range is ulf:vlf
Settings['PSD-Signal']['LF'] = 150 #max of the range of the low freq band. range is vlf:lf
Settings['PSD-Signal']['HF'] = 300 #max of the range of the high freq band. range is lf:hf. hf can be no more than (hz/2) where hz is the sampling frequency
Settings['PSD-Signal']['dx'] = 2 #segmentation for integration of the area under the curve.

Use the block below to generate the PSD graph and power in bands results (if selected). scale toggles which units to use for the graph:

raw = s^2/Hz
db = dB/Hz = 10*log10(s^2/Hz)

Graph and table are automatically saved in the PSD-Signal subfolder.


In [ ]:
scale = 'raw' #raw or db
Results = psd_signal(version = 'original', key = Settings['Label'], scale = scale, 
                     Data = Data, Settings = Settings, Results = Results)
Results['PSD-Signal']

Spectrogram

Use the block below to get the spectrogram of the signal. The frequency (y-axis) scales automatically to only show 'active' frequencies. This can take some time to run.

version = 'original'
key = 'Mean1'

After transformation is run, you can call version = 'trans'. This graph is not automatically saved.

Spectrogram


In [ ]:
version = 'original'
key = Settings['Label']
spectogram(version, key, Data, Settings, Results)

Descriptive Statistics

Moving/Sliding Averages, Standard Deviation, and Count

Generates the moving mean, standard deviation, and count for a given measurement across all columns of the Data in the form of a DataFrame (displayed as a table). The DataFrames of these three results are automatically saved as .csv files with the window size in the name. If meas == 'all', the function loops and produces these tables for all measurements.

event_type = 'Peaks'
meas = 'all'
window = 30

Moving Stats


In [93]:
#Moving Stats
event_type = 'Bursts'
meas = 'Total Cycle Time'
window = 30 #seconds
Results = moving_statistics(event_type, meas, window, Data, Settings, Results)


Total Cycle Time Count
     pleth.txt
0            0
30           0
60           0
90           0
120          0
150          0
180          0
210          0
240          0
270          0
300          0
330          0
360          0
390          0
420          0
450          0
480          0
510          0
540        119
570        166
600        119
630         99
660        128
690        146
720        208
750        122
780        116
810        114
840        170
Total Cycle Time Mean
     pleth.txt
0     0.000000
30    0.000000
60    0.000000
90    0.000000
120   0.000000
150   0.000000
180   0.000000
210   0.000000
240   0.000000
270   0.000000
300   0.000000
330   0.000000
360   0.000000
390   0.000000
420   0.000000
450   0.000000
480   0.000000
510   0.000000
540   0.140704
570   0.149724
600   0.217416
630   0.229639
660   0.201664
690   0.163940
720   0.133573
750   0.184018
780   0.170933
810   0.159822
840   0.155459
Total Cycle Time Std
     pleth.txt
0     0.000000
30    0.000000
60    0.000000
90    0.000000
120   0.000000
150   0.000000
180   0.000000
210   0.000000
240   0.000000
270   0.000000
300   0.000000
330   0.000000
360   0.000000
390   0.000000
420   0.000000
450   0.000000
480   0.000000
510   0.000000
540   0.082745
570   0.112751
600   0.187068
630   0.200834
660   0.201966
690   0.067585
720   0.082877
750   0.104436
780   0.084920
810   0.076713
840   0.120919

Entropy

Histogram Entropy

Calculates the histogram entropy of a measurement for each column of data. Also saves the histogram of each. If meas is set to 'all', then all available measurements from the chosen event_type will be calculated iteratively.

If all of the samples fall in one bin, regardless of the bin size, we have the most predictable situation and the entropy is 0. If the samples are uniformly distributed across the bins, the entropy reaches its maximum of 1.
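
A minimal sketch of that normalization (a hypothetical helper, not BASS's histent implementation):

import numpy as np

def normalized_hist_entropy(x, bins=10):
    #Shannon entropy of the histogram, rescaled to the range [0, 1]
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()    #probability of each occupied bin
    h = -(p * np.log(p)).sum()
    return h / np.log(bins)                  #0 = one occupied bin, 1 = uniform

print(normalized_hist_entropy(np.ones(100)))                    #-> 0.0
print(normalized_hist_entropy(np.random.uniform(size=100000)))  #-> close to 1.0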

event_type = 'Bursts'
meas = 'all'

Histogram Entropy


In [82]:
#Histogram Entropy
event_type = 'Bursts'
meas = 'all'
Results = histent_wrapper(event_type, meas, Data, Settings, Results)
Results['Histogram Entropy']


All Bursts measurements analyzed.
Out[82]:
Burst Start Burst End Burst Duration Burst Start Amplitude Burst End Amplitude Edge Event Interburst Interval Total Cycle Time Peaks per Burst Peak Amp Peak Time Attack Decay Intraburst Frequency Burst Area
pleth.txt 0.973718 0.973718 0.590168 0.858843 0.850922 0 0.505385 0.520613 0 0.827021 0.973718 0.67141 0.535883 0.805133 0.630351

Approximate entropy

This only runs if you have pyeeg.py in the same folder as this notebook and bass.py. WARNING: THIS FUNCTION RUNS SLOWLY.
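
A quick, optional way to check that pyeeg is importable before running the entropy blocks (just a sketch):

try:
    import pyeeg  #expects pyeeg.py alongside this notebook and bass.py
    print('pyeeg found at:', pyeeg.__file__)
except ImportError:
    print('pyeeg.py not found -- copy it into the notebook folder first.')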

Run the code below to get the approximate entropy of any measurement or raw signal. It returns the entropy of the entire results array (no windowing), using the following M and R values:

M = 2  
R = 0.2*std(measurement)

These values can be modified in the source code. Alternatively, you can call ap_entropy directly. Supports 'all'.

Interpretation: A time series containing many repetitive patterns has a relatively small ApEn; a less predictable process has a higher ApEn.

Approximate Entropy in BASS

Aproximate Entropy Source

Events


In [ ]:
#Approximate Entropy
event_type = 'Peaks'
meas = 'Intervals'
Results = ap_entropy_wrapper(event_type, meas, Data, Settings, Results)
Results['Approximate Entropy']

Time Series


In [ ]:
#Approximate Entropy on raw signal
#takes a VERY long time
from pyeeg import ap_entropy

version = 'original' #original, trans, shift, or rolling
key = Settings['Label'] #Mean1 default key for one time series
start = 0 #seconds, where you want the slice to begin
end = 1 #seconds, where you want the slice to end. The absolute end is -1

ap_entropy(Data[version][key][start:end].tolist(), 2, (0.2*np.std(Data[version][key][start:end])))

Sample Entropy

This only runs if you have pyeeg.py in the same folder as this notebook and bass.py. WARNING: THIS FUNCTION RUNS SLOWLY.

Run the code below to get the sample entropy of any measurement. It returns the entropy of the entire results array (no windowing), using the following M and R values:

M = 2  
R = 0.2*std(measurement)

These values can be modified in the source code. Alternatively, you can call samp_entropy directly. Supports 'all'.

Sample Entropy in BASS

Sample Entropy Source

Events


In [73]:
#Sample Entropy
event_type = 'Bursts'
meas = 'Total Cycle Time'
Results = samp_entropy_wrapper(event_type, meas, Data, Settings, Results)
Results['Sample Entropy']


Out[73]:
Attack Decay Total Cycle Time
pleth.txt 1.694886 1.770121 0.796896

In [74]:
Results['Sample Entropy']['Attack']


Out[74]:
pleth.txt    1.694886
Name: Attack, dtype: float64

Time Series


In [ ]:
#on raw signal
#takes a VERY long time
from pyeeg import samp_entropy

version = 'original' #original, trans, shift, or rolling
key = Settings['Label']
start = 0 #seconds, where you want the slice to begin
end = 1 #seconds, where you want the slice to end. The absolute end is -1

samp_entropy(Data[version][key][start:end].tolist(), 2, (0.2*np.std(Data[version][key][start:end])))

Helpful Stuff

While not completely up to date with some of the new changes, the Wiki can be useful if you have questions about some of the settings: https://github.com/drcgw/SWAN/wiki/Tutorial

More Help?

Stuck on a particular step or function? Try typing the function name followed by two question marks (??). This will pop up the docstring and source code. You can also call help() to have the notebook print the docstring.

Example:
analyze??
help(analyze)

In [ ]:
help(moving_statistics)

In [ ]:
moving_statistics??

Blank Code Block

You're still here, reading? You must be a dedicated super user!

If that is the case, then you must know how to code in Python. Use this space to get crazy with your own advanced analysis and stuff.

Blank Code Block


In [ ]: